Laplacian matrix


Subspace Projection Methods for Fast Spectral Embeddings of Evolving Graphs

Eini, Mohammad, Karaaslanli, Abdullah, Kalantzis, Vassilis, Traganitis, Panagiotis A.

arXiv.org Machine Learning

Many downstream tasks in graph data mining, signal processing, and machine learning rely on information related to the eigenvectors of the associated adjacency or Laplacian matrix. Classical eigendecomposition methods are powerful when the matrix remains static, but they cannot be applied to problems where the matrix entries are updated or the number of rows and columns grows frequently. Such scenarios occur routinely in graph analytics when the graph changes dynamically, with edges and/or nodes being added and removed. This paper puts forth a new algorithmic framework to update the eigenvectors associated with the leading eigenvalues of an initial adjacency or Laplacian matrix as the graph evolves. The proposed algorithm is based on Rayleigh-Ritz projections, in which the original eigenvalue problem is projected onto a restricted subspace that ideally encapsulates the invariant subspace associated with the sought eigenvectors. Following ideas from eigenvector perturbation analysis, we present a new methodology to build the projection subspace. The proposed framework features lower computational and memory complexity than competing alternatives, while empirical results show strong performance, both in terms of eigenvector approximation and accuracy on the downstream learning tasks of central node identification and node clustering.
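
The generic Rayleigh-Ritz step the abstract describes can be sketched in a few lines of Python. This is a minimal sketch, assuming a dense symmetric matrix and an orthonormal basis V for the projection subspace; the function name rayleigh_ritz_update and the choice of basis are illustrative, not the authors' exact construction.

    import numpy as np

    def rayleigh_ritz_update(A_new, V, k):
        """Approximate the k leading eigenpairs of the updated matrix A_new
        via a Rayleigh-Ritz projection onto the subspace spanned by the
        columns of the orthonormal n-by-m matrix V (m >= k)."""
        # Project the eigenproblem onto span(V): an m-by-m problem.
        H = V.T @ A_new @ V
        # Solve the small symmetric eigenproblem (Ritz values ascending).
        theta, S = np.linalg.eigh(H)
        # Keep the k largest Ritz values and lift the Ritz vectors back to R^n.
        idx = np.argsort(theta)[::-1][:k]
        return theta[idx], V @ S[:, idx]

In practice, V would be built from the previous leading eigenvectors augmented with perturbation-derived directions, as the abstract suggests, and re-orthonormalized (e.g. with np.linalg.qr) before projecting.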


Supplementary Materials: Semi-Supervised Contrastive Learning for Deep Regression with Ordinal Rankings from Spectral Seriation

Neural Information Processing Systems

The main result is presented in Theorem 2; we outline the proof below for interested readers. According to the definition of the Fiedler vector, we have (L + ΔL)(f + Δf) = (λ + Δλ)(f + Δf). We first present Stewart's theorem in Lemma 1 to assist the proof. Actual running times may differ depending on hardware and environment. We also report the number of model parameters required for each method in Table S3. Hyper-parameters were selected based on a coarse search on the validation set.
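
As a brief sketch of the first-order step this identity enables (standard eigenvalue perturbation analysis, assumed here rather than taken from the supplement): expanding the product, substituting Lf = λf, and dropping the second-order terms ΔL Δf and Δλ Δf yields

    L Δf + ΔL f ≈ λ Δf + Δλ f,

which relates the perturbation Δf of the Fiedler vector to the perturbation ΔL of the Laplacian.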


Structured Graph Learning Via Laplacian Spectral Constraints

Sandeep Kumar, Jiaxi Ying, Jose Vinicius de Miranda Cardoso, Daniel Palomar

Neural Information Processing Systems

Learning a graph with a specific structure is essential for interpretability and identification of the relationships among data. It is well known that structured graph learning from observed samples is an NP-hard combinatorial problem. In this paper, we first show that for a set of important graph families it is possible to convert the structural constraints into eigenvalue constraints on the graph Laplacian matrix.
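
As a small illustration of why spectral constraints can encode structure (a standard fact about the Laplacian, not this paper's algorithm): the combinatorial Laplacian of a graph with k connected components has eigenvalue 0 with multiplicity exactly k, so constraining the number of zero eigenvalues constrains the number of components. The helper num_components below is hypothetical.

    import numpy as np

    def num_components(L, tol=1e-8):
        # Connected components = multiplicity of the zero eigenvalue
        # of the combinatorial graph Laplacian L.
        return int(np.sum(np.linalg.eigvalsh(L) < tol))

    # Two disjoint edges -> a Laplacian with exactly two zero eigenvalues.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    assert num_components(L) == 2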